In the mathematical field of numerical analysis, '''Runge's phenomenon''' is a problem of oscillation at the edges of an interval that occurs when using polynomial interpolation with polynomials of high degree over a set of equispaced interpolation points. It was discovered by Carl David Tolmé Runge (1901) when exploring the behavior of errors when using polynomial interpolation to approximate certain functions. The discovery was important because it shows that going to higher degrees does not always improve accuracy. The phenomenon is similar to the Gibbs phenomenon in Fourier series approximations.

==Introduction==
The Weierstrass approximation theorem states that for every continuous function ''f''(''x'') defined on an interval [''a'', ''b''], there exists a sequence of polynomial functions ''P''<sub>''n''</sub>(''x'') for ''n'' = 0, 1, 2, …, each of degree at most ''n'', that approximates ''f''(''x'') with uniform convergence over [''a'', ''b''] as ''n'' tends to infinity, that is,

:<math>\lim_{n \to \infty} \max_{a \le x \le b} \left| f(x) - P_n(x) \right| = 0.</math>

Consider the case where one desires to interpolate through ''n'' + 1 equispaced points of a function ''f''(''x'') using the degree-''n'' polynomial ''P''<sub>''n''</sub>(''x'') that passes through those points. Naturally, one might expect from Weierstrass' theorem that using more points would lead to a more accurate reconstruction of ''f''(''x''). However, this ''particular'' sequence of polynomial functions ''P''<sub>''n''</sub>(''x'') is not guaranteed to have the property of uniform convergence; the theorem only states that a sequence of polynomial functions exists, without providing a general method of finding one. The ''P''<sub>''n''</sub>(''x'') produced in this manner may in fact diverge away from ''f''(''x'') as ''n'' increases; this divergence typically occurs in an oscillating pattern that magnifies near the ends of the interpolation interval. This phenomenon is attributed to Runge.
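The divergence described above can be observed numerically. The sketch below (a minimal illustration; the function and helper names are chosen for this example, not taken from any source) interpolates Runge's classic example ''f''(''x'') = 1/(1 + 25''x''²) on [−1, 1] through ''n'' + 1 equispaced nodes using the naive Lagrange form, and prints the maximum interpolation error for increasing degree ''n'':

```python
import numpy as np

def runge(x):
    """Runge's example function f(x) = 1 / (1 + 25 x^2)."""
    return 1.0 / (1.0 + 25.0 * x**2)

def interp_eval(nodes, vals, x):
    """Evaluate the Lagrange interpolating polynomial through
    (nodes, vals) at the points x, using the naive Lagrange form."""
    result = np.zeros_like(x)
    for j in range(len(nodes)):
        # j-th Lagrange basis polynomial evaluated at x
        basis = np.ones_like(x)
        for k in range(len(nodes)):
            if k != j:
                basis *= (x - nodes[k]) / (nodes[j] - nodes[k])
        result += vals[j] * basis
    return result

def max_error(n, num_samples=1001):
    """Max |f(x) - P_n(x)| on [-1, 1] for the degree-n interpolant
    through n + 1 equispaced nodes."""
    nodes = np.linspace(-1.0, 1.0, n + 1)
    xs = np.linspace(-1.0, 1.0, num_samples)
    err = np.abs(runge(xs) - interp_eval(nodes, runge(nodes), xs))
    return float(np.max(err))

for n in (5, 10, 20):
    print(f"degree {n:2d}: max error = {max_error(n):.3g}")
```

The printed errors grow rather than shrink as the degree increases, and the largest errors occur near the endpoints ±1, which is exactly Runge's phenomenon.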